58 research outputs found

    Bayesian inference of spatial and temporal relations in AI patents for EU countries

    In the paper, we propose two models of Artificial Intelligence (AI) patents in European Union (EU) countries addressing spatial and temporal behaviour. In particular, the models can quantitatively describe the interaction between countries or explain the rapidly growing trend in AI patents. For the spatial analysis, Poisson regression is used to explain collaboration between a pair of countries, measured by the number of common patents. Through Bayesian inference, we estimated the strengths of interactions between countries in the EU and the rest of the world. In particular, a significant lack of cooperation has been identified for some pairs of countries. For the temporal analysis, an inhomogeneous Poisson process combined with logistic growth accurately models the trend in patenting activity. Bayesian analysis in the time domain revealed an upcoming slowdown in patenting intensity. The research was supported in part by PL-Grid Infrastructure, the POWER 2014–2020 program, and the Polish Ministry of Science and Higher Education with the subvention funds of the Faculty of Computer Science, Electronics and Telecommunications of AGH University.
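    As an illustration of the temporal model, the sketch below fits a logistic growth curve to cumulative patent counts and takes its derivative as the intensity of an inhomogeneous Poisson process. The yearly counts, parameter names and starting values are hypothetical stand-ins, not data from the paper.

```python
# A minimal sketch, assuming hypothetical cumulative AI patent counts per year.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    """Logistic curve: carrying capacity K, growth rate r, midpoint t0."""
    return K / (1.0 + np.exp(-r * (t - t0)))

years = np.arange(2005, 2020)
cumulative = np.array([3, 5, 8, 13, 20, 31, 48, 72, 105, 150,
                       205, 268, 335, 400, 455], dtype=float)

# Fit the logistic trend to the cumulative counts.
(K, r, t0), _ = curve_fit(logistic, years, cumulative, p0=[600.0, 0.4, 2015.0])

def intensity(t):
    """lambda(t) = d/dt logistic(t): the inhomogeneous Poisson intensity."""
    s = logistic(t, K, r, t0)
    return r * s * (1.0 - s / K)

print(f"K={K:.0f}, r={r:.2f}, t0={t0:.1f}, lambda(2020)={intensity(2020):.1f}")
```

    Once the fitted midpoint t0 lies in the past, the derived intensity is already declining, which is precisely the kind of slowdown in patenting intensity that the Bayesian analysis points to.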

    Fine-grained Network Traffic Prediction from Coarse Data

    ICT systems provide detailed information on computer network traffic. However, due to storage limitations, some of the information on past traffic is often retained only in aggregated form. In this paper, we show that Linear Gaussian State Space Models yield simple yet effective methods for making predictions based on time series at different aggregation levels. The models link coarse-grained and fine-grained time series in a single model that is able to provide fine-grained predictions. Our numerical experiments show up to a 3.7-fold improvement in expected mean absolute forecast error when forecasts are made using, instead of ignoring, additional coarse-grained observations. The forecasts are obtained in a Bayesian formulation of the model, which allows a traffic prediction service to be provisioned with highly informative priors obtained from coarse-grained historical data.
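    A minimal sketch of the core idea, assuming a random-walk traffic level and a fixed aggregation window: the Kalman filter state keeps the last k fine-grained levels, so that coarse observations (window sums) and fine observations fit into a single Linear Gaussian State Space Model. The window size, noise variances and dynamics below are assumptions, not the paper's model.

```python
import numpy as np

k = 4                                    # aggregation window length (assumed)
F = np.zeros((k, k))                     # transition: shift register of levels
F[0, 0] = 1.0                            # newest level follows a random walk
F[1:, :-1] = np.eye(k - 1)               # older levels shift down one slot
Q = np.zeros((k, k)); Q[0, 0] = 1.0      # process noise only on newest level
R_fine, R_coarse = 0.5, 0.1              # observation noise variances (assumed)

H_fine = np.zeros((1, k)); H_fine[0, 0] = 1.0   # fine obs: current level
H_coarse = np.ones((1, k))                      # coarse obs: sum over window

def kalman_step(m, P, y, H, R):
    """One predict-then-update step; skips the update when y is missing."""
    m, P = F @ m, F @ P @ F.T + Q
    if y is not None:
        S = H @ P @ H.T + R                     # innovation variance
        K = P @ H.T / S                         # Kalman gain
        m = m + (K * (y - H @ m)).ravel()
        P = P - K @ H @ P
    return m, P

m, P = np.zeros(k), np.eye(k)
# Toy stream: fine measurements were discarded; only every k-th step delivers
# an aggregated value, yet the filter still estimates the fine-grained level.
for y in [None, None, None, 8.1, None, None, None, 8.9]:
    H, R = (H_coarse, R_coarse) if y is not None else (H_fine, R_fine)
    m, P = kalman_step(m, P, y, H, R)
print("estimated current fine-grained level:", round(m[0], 2))
```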

    RiskNet: neural risk assessment in networks of unreliable resources

    We propose a graph neural network (GNN)-based method to predict the distribution of penalties induced by outages in communication networks where connections are protected by resources shared between working and backup paths. The GNN-based algorithm is trained only on random graphs generated from the Barabási–Albert model, yet the results obtained show that it accurately models the penalties in a wide range of existing topologies. We show that GNNs eliminate the need to simulate complex outage scenarios for the network topologies under study: in practice, an entire path-placement evaluation based on the prediction takes no longer than 4 ms on modern hardware. In this way, we gain a speed-up of up to 12,000 times compared to calculations based on simulations. This work was supported by the Polish Ministry of Science and Higher Education with the subvention funds of the Faculty of Computer Science, Electronics and Telecommunications of AGH University of Science and Technology (P.B., P.C.) and by the PL-Grid Infrastructure (K.R.).
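    The sketch below illustrates the training-data side of the approach: topologies are random Barabási–Albert graphs (via networkx), while a generic, untrained message-passing readout stands in for the paper's GNN. The feature choices, layer sizes and weights are illustrative assumptions.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
G = nx.barabasi_albert_graph(n=20, m=2, seed=0)   # one random training topology
A = nx.to_numpy_array(G)                          # adjacency matrix
deg = A.sum(axis=1, keepdims=True)

# Node features: degree and a per-node failure probability (assumed inputs).
X = np.hstack([deg, rng.uniform(0.001, 0.05, size=(len(G), 1))])

W1 = rng.normal(size=(2, 16)); W2 = rng.normal(size=(16, 16))
H = np.tanh(X @ W1)                                 # initial node embeddings
for _ in range(3):                                  # T rounds of message passing
    H = np.tanh((A / np.maximum(deg, 1)) @ H @ W2)  # mean-aggregate neighbours

w_out = rng.normal(size=16)
penalty = H.mean(axis=0) @ w_out                    # graph-level readout
print("predicted outage penalty (untrained, illustrative):", penalty)
```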

    Leveraging Spatial and Temporal Correlations for Network Traffic Compression

    The deployment of modern network applications is increasing network size and traffic volumes at an unprecedented pace. Storing network-related information (e.g., traffic traces) is key to enabling efficient network management. However, this task is becoming more challenging due to ever-increasing data transmission rates and traffic volumes. In this paper, we present a novel method for network traffic compression that exploits the spatial and temporal patterns naturally present in network traffic. We consider a realistic scenario where traffic measurements are performed at multiple links of a network topology using tools like SNMP or NetFlow. Such measurements can be seen as multiple time series that exhibit spatial and temporal correlations induced by the network topology, routing, or user behavior. Our method leverages graph learning to effectively exploit both types of correlations for traffic compression. The experimental results show that our solution outperforms GZIP, the de facto traffic compression method, improving the compression ratio by 50%-65% on three real-world networks.
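    A minimal sketch of why temporal correlation helps: delta-encoding a smooth, traffic-like series before GZIP-compressing it shrinks the output relative to GZIP on the raw bytes. The synthetic series and the trivial last-value predictor stand in for the paper's graph-learning model, which additionally exploits spatial correlations across links.

```python
import zlib
import numpy as np

rng = np.random.default_rng(1)
# Synthetic "link utilisation" series: a daily pattern plus small noise
# (288 five-minute bins per day), assumed for illustration only.
t = np.arange(10_000)
traffic = (1000 + 300 * np.sin(2 * np.pi * t / 288)
           + rng.normal(0, 5, t.size)).astype(np.int32)

raw = traffic.tobytes()
# Last-value predictor: store only the residuals (first differences).
residual = np.diff(traffic, prepend=traffic[0]).astype(np.int32).tobytes()

print("gzip(raw)      :", len(zlib.compress(raw, 9)), "bytes")
print("gzip(residuals):", len(zlib.compress(residual, 9)), "bytes")
```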

    Deep Reinforcement Learning meets Graph Neural Networks: exploring a routing optimization use case

    Recent advances in Deep Reinforcement Learning (DRL) have shown significant improvement in decision-making problems. The networking community has started to investigate how DRL can provide a new breed of solutions to relevant optimization problems, such as routing. However, most state-of-the-art DRL-based networking techniques fail to generalize: they can only operate over network topologies seen during training, not over new ones. The reason behind this important limitation is that existing DRL networking solutions use standard neural networks (e.g., fully connected), which are unable to learn graph-structured information. In this paper, we propose to use Graph Neural Networks (GNN) in combination with DRL. GNNs have recently been proposed to model graphs, and our novel DRL+GNN architecture is able to learn, operate and generalize over arbitrary network topologies. To showcase its generalization capabilities, we evaluate it on an Optical Transport Network (OTN) scenario, where the agent needs to allocate traffic demands efficiently. Our results show that the DRL+GNN agent achieves outstanding performance in topologies unseen during training.
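    The sketch below illustrates the decision step of such an agent in spirit: message passing over the line graph produces per-link embeddings, and a candidate path for a demand is scored by aggregating the embeddings of its links. Because nothing depends on the number of nodes or links, the same (here untrained) weights apply to unseen topologies. The architecture, features and greedy action selection are generic stand-ins, not the paper's exact design.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(2)
G = nx.cycle_graph(6)                             # toy topology
for u, v in G.edges:
    G.edges[u, v]["free"] = int(rng.integers(2, 10))  # free capacity per link

W = rng.normal(size=(1, 8)); U = rng.normal(size=(8, 8)); q = rng.normal(size=8)

def link_embeddings(G, rounds=3):
    """Message passing on the line graph: each link exchanges state with the
    links sharing one of its endpoints; input feature is free capacity."""
    L = nx.line_graph(G)
    h = {e: np.tanh(np.array([G.edges[e]["free"]]) @ W) for e in L.nodes}
    for _ in range(rounds):
        h = {e: np.tanh(sum(h[n] for n in L.neighbors(e)) @ U) for e in h}
    return h

h = link_embeddings(G)

def q_value(path):
    """Score a candidate path: sum its link embeddings, apply linear readout."""
    edges = zip(path, path[1:])
    return sum(h[e] if e in h else h[(e[1], e[0])] for e in edges) @ q

demand = (0, 3)                                      # source, destination
candidates = list(nx.all_simple_paths(G, *demand, cutoff=5))
print("chosen path:", max(candidates, key=q_value))  # greedy action
```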

    Unveiling the potential of Graph Neural Networks for network modeling and optimization in SDN

    Network modeling is a critical component for building self-driving Software-Defined Networks, particularly for finding optimal routing schemes that meet the goals set by administrators. However, existing modeling techniques do not meet the requirements to provide accurate estimations of relevant performance metrics such as delay and jitter. In this paper, we propose a novel Graph Neural Network (GNN) model able to understand the complex relationship between topology, routing and input traffic to produce accurate estimates of the per-source/destination-pair mean delay and jitter. GNNs are tailored to learn and model information structured as graphs; as a result, our model is able to generalize over arbitrary topologies, routing schemes and variable traffic intensity. In the paper, we show that our model provides accurate estimates of delay and jitter (worst-case R² = 0.86) when tested against topologies, routing and traffic not seen during training. In addition, we present the potential of the model for network operation through several use-cases that show its effective use in per-source/destination-pair delay/jitter routing optimization, and its generalization capabilities by reasoning about topologies and routing schemes not seen during training. This work was supported by an AGH University of Science and Technology grant, under contract no. 15.11.230.400, the Spanish MINECO under contract TEC2017-90034-C2-1-R (ALLIANCE), and the Catalan Institution for Research and Advanced Studies (ICREA). The research was also supported in part by PL-Grid Infrastructure.
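    The following is a schematic sketch, under assumed dimensions and update rules, of how path and link states can be coupled by message passing: path states are updated from the links they traverse, link states from the paths crossing them, and a readout maps path states to delay and jitter. It is untrained and illustrative, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(3)
d = 8                                            # state dimension (assumed)
links = ["l0", "l1", "l2"]
routing = {("A", "B"): ["l0", "l1"],             # per src-dst path -> links
           ("A", "C"): ["l0", "l2"]}
traffic = {("A", "B"): 0.7, ("A", "C"): 0.3}     # input traffic intensities

Wp = rng.normal(size=(d, d)); Wl = rng.normal(size=(d, d))
readout = rng.normal(size=(d, 2))                # maps to (mean delay, jitter)

h_link = {l: rng.normal(size=d) * 0.1 for l in links}
h_path = {p: np.tanh(np.full(d, traffic[p])) for p in routing}

for _ in range(4):                               # T message-passing iterations
    # Paths absorb the states of the links along their route.
    h_path = {p: np.tanh(sum(h_link[l] for l in routing[p]) @ Wp + h_path[p])
              for p in routing}
    # Links absorb the states of every path that crosses them.
    h_link = {l: np.tanh(sum(h_path[p] for p in routing if l in routing[p])
                         @ Wl + h_link[l]) for l in links}

for p, hp in h_path.items():
    delay, jitter = hp @ readout
    print(p, "-> delay, jitter (untrained, illustrative):", delay, jitter)
```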
    • …